Optimizing for website performance includes setting long expiration dates on our static resources, such as images, stylesheets and JavaScript files. Doing that tells the browser to cache our files so it doesn’t have to request them every time the user loads a page. This is one of the most important things to do when optimizing websites.

In ASP.NET on IIS 7+ it’s really easy. Just add this chunk of XML to the web.config’s <system.webServer> element:

<staticContent>
  <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="365.00:00:00" />
</staticContent>

The above code tells the browsers to automatically cache all static resources for 365 days. That’s good and you should do this right now.
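With that in place, IIS should serve static files with a Cache-Control response header along these lines (365 days expressed in seconds):

Cache-Control: max-age=31536000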

The issue becomes clear the first time you make a change to any static file. How is the browser going to know that you made a change, so it can download the latest version of the file? The answer is that it can’t. It will keep serving the same cached version of the file for the next 365 days, regardless of any changes you make to the files.

Fingerprinting

The good news is that it’s fairly trivial to change our code so that the URL pointing to the static files changes too, tricking the browser into believing it’s a brand-new resource that needs to be downloaded.

Here’s a little class that I use on several websites. It adds a fingerprint, or timestamp, to the URL of the static file.

using System;
using System.IO;
using System.Web;
using System.Web.Caching;
using System.Web.Hosting;

public class Fingerprint
{
  public static string Tag(string rootRelativePath)
  {
    if (HttpRuntime.Cache[rootRelativePath] == null)
    {
      // Map the root-relative URL to the physical file on disk.
      string absolute = HostingEnvironment.MapPath("~" + rootRelativePath);

      // Use the file's last-write time as the fingerprint.
      DateTime date = File.GetLastWriteTime(absolute);
      int index = rootRelativePath.LastIndexOf('/');

      // Insert a "/v-<ticks>" segment just before the file name.
      string result = rootRelativePath.Insert(index, "/v-" + date.Ticks);

      // Cache the fingerprinted URL and evict it when the file changes.
      HttpRuntime.Cache.Insert(rootRelativePath, result, new CacheDependency(absolute));
    }

    return HttpRuntime.Cache[rootRelativePath] as string;
  }
}
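Because the cache entry takes a CacheDependency on the physical file, ASP.NET evicts it automatically the moment the file changes on disk, and the next request computes a fresh fingerprint.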

To use this class, all you need to do is modify the references to the static files.

Modify references

Here’s what it looks like in Razor for the stylesheet reference:

<link rel="stylesheet" href="@Fingerprint.Tag("/content/site.css")" />

…and in WebForms:

<link rel="stylesheet" href="<%=Fingerprint.Tag(" />content/site.css") %>" />

The result of using the Fingerprint.Tag method will in this case be:

<link rel="stylesheet" href="/content/v-634933238684083941/site.css" />

Since the URL now references a non-existing folder (v-634933238684083941), we need to make the web server pretend it exists. We do that with URL rewriting.

URL rewrite

By adding this snippet of XML to the web.config’s <system.webServer> section, we instruct IIS 7+ to intercept all URLs with a folder name matching “v-[numbers]” and rewrite the URL to the original file path.

<rewrite>
  <rules>
    <rule name="fingerprint">
      <match url="([\S]+)(/v-[0-9]+/)([\S]+)" />
      <action type="Rewrite" url="{R:1}/{R:3}" />
    </rule>
  </rules>
</rewrite>
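If you want to sanity-check the rewrite pattern outside of IIS, the same regular expression can be exercised with .NET’s Regex class. A minimal sketch, using the URL produced above:

using System;
using System.Text.RegularExpressions;

class RewriteCheck
{
  static void Main()
  {
    // The same pattern the IIS rule uses; "$1/$3" mirrors "{R:1}/{R:3}".
    string url = "/content/v-634933238684083941/site.css";
    string rewritten = Regex.Replace(url, @"([\S]+)(/v-[0-9]+/)([\S]+)", "$1/$3");

    Console.WriteLine(rewritten); // prints "/content/site.css"
  }
}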

You can use this technique for all your JavaScript and image files as well.

The beauty is that every time you change one of the referenced static files, the fingerprint changes as well. This creates a brand-new URL every time, so browsers will download the updated files.

FYI, you need to run the AppPool in Integrated Pipeline mode for the <system.webServer> section to have any effect.

What's the address of your website? www.domain.com or domain.com?

There are two camps on the subject of the www subdomain. One believes it should be enforced (www.yes-www.org) and the other (no-www.org) that it should be removed. They are both right.

What's important is that there is only a single canonical address to your website – with or without www.

The web.config makes it easy for us to either enforce or remove the www subdomain using URL rewrites. There are many examples online of how to do this, but they all share two fundamental flaws: the rules have a direct dependency on the domain name, and they don't work with both HTTP and HTTPS.

So let's see if we can create generic URL rewrite rules that can be used on any website without modifications.

Your server needs to have the URL Rewrite module installed. Chances are that it does already. Azure Websites does, and so do all of my other hosting providers.

Rewrite rules need to be placed inside the <rewrite> element in web.config:

<system.webServer>
  <rewrite>
    <rules>
      <!-- My rules -->
    </rules>
  </rewrite>
</system.webServer>

So here are two rules that work on all domains and on both HTTP and HTTPS.

Remove WWW

This rule redirects any incoming request to www.domain.com to domain.com while preserving the HTTP(S) protocol:

<rule name="Remove WWW" patternSyntax="Wildcard" stopProcessing="true">
  <match url="*" />
  <conditions>
    <add input="{CACHE_URL}" pattern="*://www.*" />
  </conditions>
  <action type="Redirect" url="{C:1}://{C:2}" redirectType="Permanent" />
</rule>
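To illustrate (example.com is just a placeholder), {C:1} captures the protocol and {C:2} everything after “www.”, so the rule issues permanent redirects like:

http://www.example.com/about  ->  http://example.com/about
https://www.example.com/docs  ->  https://example.com/docs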

Enforce WWW

This rule redirects any incoming request to domain.com to www.domain.com while preserving the HTTP(S) protocol:

<rule name="Enforce WWW" stopProcessing="true">
  <match url=".*" />
  <conditions>
    <add input="{CACHE_URL}" pattern="^(.+)://(?!www)(.*)" />
  </conditions>
  <action type="Redirect" url="{C:1}://www.{C:2}" redirectType="Permanent" />
</rule>
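Again, a minimal sketch to sanity-check the pattern outside of IIS with .NET’s Regex class (domain.com stands in for any host):

using System;
using System.Text.RegularExpressions;

class EnforceWwwCheck
{
  static void Main()
  {
    // Same pattern as the rule; "$1://www.$2" mirrors "{C:1}://www.{C:2}".
    string pattern = @"^(.+)://(?!www)(.*)";

    string redirected = Regex.Replace("http://domain.com/about", pattern, "$1://www.$2");
    Console.WriteLine(redirected); // prints "http://www.domain.com/about"

    // A URL that already has www is left alone thanks to the (?!www) lookahead.
    string unchanged = Regex.Replace("http://www.domain.com/", pattern, "$1://www.$2");
    Console.WriteLine(unchanged); // prints "http://www.domain.com/"
  }
}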

So there you have it. It's easy once you know how.

For more info on the URL Rewrite Module, see the Configuration Reference.